Adjusted mutual information (English Wikipedia)
In probability theory and information theory, adjusted mutual information (AMI) is a variation of mutual information that may be used for comparing clusterings. It corrects for the effect of agreement due solely to chance between clusterings, similar to the way the adjusted Rand index corrects the Rand index. It is closely related to variation of information: when a similar adjustment is made to the VI index, it becomes equivalent to the AMI. The adjusted measure, however, is no longer metrical.
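As a quick illustration of the chance correction, assuming scikit-learn is installed (its sklearn.metrics module provides mutual_info_score and adjusted_mutual_info_score), raw MI is positive even for independent random labelings, while AMI scores perfect agreement as 1 regardless of how cluster ids are named:

```python
import random
from sklearn.metrics import mutual_info_score, adjusted_mutual_info_score

# Two labelings that agree perfectly, up to a renaming of cluster ids.
a = [0, 0, 1, 1, 2, 2]
b = [1, 1, 0, 0, 2, 2]
print(adjusted_mutual_info_score(a, b))   # 1.0: chance-corrected score is maximal

# Independent random labelings still share some raw MI purely by chance.
random.seed(0)
x = [random.randrange(3) for _ in range(300)]
y = [random.randrange(3) for _ in range(300)]
print(mutual_info_score(x, y) > 0)        # True: uncorrected MI is positive by chance
```

The labelings and seed here are arbitrary; any independent random pair shows the same qualitative effect.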
==Mutual Information of two Partitions==
Given a set ''S'' of ''N'' elements S=\{s_1, s_2, \ldots, s_N\}, consider two partitions of ''S'', namely U=\{U_1, U_2, \ldots, U_R\} with ''R'' clusters, and V=\{V_1, V_2, \ldots, V_C\} with ''C'' clusters. It is presumed here that the partitions are so-called ''hard clusters''; the partitions are pairwise disjoint:
:U_i\cap U_j = V_i\cap V_j = \varnothing
for all i\ne j, and complete:
:\bigcup_{i=1}^R U_i=\bigcup_{j=1}^C V_j=S
The mutual information of cluster overlap between ''U'' and ''V'' can be summarized in the form of an ''R''×''C'' contingency table M=[n_{ij}]_{i=1 \ldots R}^{j=1 \ldots C}, where n_{ij} denotes the number of objects that are common to clusters U_i and V_j. That is,
:n_{ij}=\left|U_i\cap V_j\right|
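Building the contingency table amounts to counting co-assignments. A minimal sketch, with hypothetical labelings of N = 6 objects:

```python
from collections import Counter

# Hypothetical hard clusterings of N = 6 objects: U has R = 2 clusters, V has C = 3.
U = [0, 0, 0, 1, 1, 1]
V = [0, 0, 1, 1, 2, 2]

# n[i][j] = |U_i ∩ V_j|: objects assigned to cluster i in U and cluster j in V.
R, C = len(set(U)), len(set(V))
counts = Counter(zip(U, V))
n = [[counts[(i, j)] for j in range(C)] for i in range(R)]

print(n)  # [[2, 1, 0], [0, 1, 2]]
```

Rows sum to the sizes |U_i| and columns to |V_j|, so the table carries everything needed for the probabilities below.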
Suppose an object is picked at random from ''S''; the probability that the object falls into cluster U_i is:
:P(i)=\frac{|U_i|}{N}
The entropy associated with the partitioning ''U'' is:
:H(U)=-\sum_{i=1}^R P(i)\log P(i)
''H(U)'' is non-negative and takes the value 0 only when there is no uncertainty in determining an object's cluster membership, ''i.e.'', when there is only one cluster. Similarly, the entropy of the clustering ''V'' can be calculated as:
:H(V)=-\sum_{j=1}^C P'(j)\log P'(j)
where P'(j)=|V_j|/N. The mutual information (MI) between the two partitions is:
:MI(U,V)=\sum_{i=1}^R \sum_{j=1}^C P(i,j)\log \frac{P(i,j)}{P(i)P'(j)}
where ''P(i,j)'' denotes the probability that a point belongs to both the cluster U_i in ''U'' and cluster V_j in ''V'':
:P(i,j)=\frac{\left|U_i\cap V_j\right|}{N}
MI is a non-negative quantity upper bounded by the entropies ''H''(''U'') and ''H''(''V''). It quantifies the information shared by the two clusterings and thus can be employed as a clustering similarity measure.
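Putting the definitions together, a minimal pure-Python sketch (the contingency table here is a hypothetical example) computes H(U), H(V), and MI(U,V), and checks the stated upper bound:

```python
from math import log

# Hypothetical 2x3 contingency table n[i][j] = |U_i ∩ V_j| for N = 6 objects.
n = [[2, 1, 0],
     [0, 1, 2]]

N = sum(sum(row) for row in n)
a = [sum(row) for row in n]           # row marginals: cluster sizes |U_i|
b = [sum(col) for col in zip(*n)]     # column marginals: cluster sizes |V_j|

def entropy(sizes, N):
    """H = -sum_i P(i) log P(i), with P(i) = |cluster_i| / N."""
    return -sum((s / N) * log(s / N) for s in sizes if s > 0)

H_U = entropy(a, N)                   # log 2 for two equal clusters
H_V = entropy(b, N)                   # log 3 for three equal clusters

# MI(U,V) = sum_ij P(i,j) log( P(i,j) / (P(i) P'(j)) ), with 0 log 0 = 0.
MI = sum((nij / N) * log((nij / N) / ((a[i] / N) * (b[j] / N)))
         for i, row in enumerate(n)
         for j, nij in enumerate(row) if nij > 0)

assert 0.0 <= MI <= min(H_U, H_V) + 1e-12   # MI is bounded by both entropies
print(round(MI, 4))                          # 0.4621, i.e. (2/3) log 2
```

For the diagonal table [[3, 0], [0, 3]] (identical partitions) the same code gives MI = H(U) = H(V) = log 2, the maximal value.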
